Sherpa: Robust hyperparameter optimization for machine learning
Authors
Abstract
Similar resources
Applying Model-Based Optimization to Hyperparameter Optimization in Machine Learning
This talk will cover the main components of sequential model-based optimization algorithms. Algorithms of this kind represent the state-of-the-art for expensive black-box optimization problems and are becoming increasingly popular for hyper-parameter optimization of machine learning algorithms, especially on larger data sets. The talk will cover the main components of sequential model-based optim...
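A minimal sketch of the loop those components form, assuming a toy one-dimensional objective: a Gaussian-process surrogate from scikit-learn is refit to past evaluations, expected improvement picks the next point, and the expensive function is evaluated there. The objective, search range, and candidate pool are illustrative stand-ins, not anything from the talk.

    # Minimal sequential model-based optimization (SMBO) loop: fit a surrogate to
    # past evaluations, pick the next point by an acquisition criterion, evaluate,
    # and repeat. The objective below is a stand-in for an expensive black box.
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor

    def objective(x):                      # hypothetical expensive black box
        return (x - 0.3) ** 2 + 0.1 * np.sin(20 * x)

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 1, size=(3, 1))     # a few initial random evaluations
    y = np.array([objective(x[0]) for x in X])

    for _ in range(20):
        gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
        cand = rng.uniform(0, 1, size=(256, 1))          # candidate pool
        mu, sigma = gp.predict(cand, return_std=True)
        best = y.min()
        # Expected improvement (minimization form) as the acquisition function.
        z = (best - mu) / np.maximum(sigma, 1e-9)
        ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
        x_next = cand[np.argmax(ei)]
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next[0]))

    print("best x:", X[np.argmin(y)][0], "best value:", y.min())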
Hyperparameter Search in Machine Learning
We describe the hyperparameter search problem in the field of machine learning and discuss its main challenges from an optimization perspective. Machine learning methods attempt to build models that capture some element of interest based on given data. Most common learning algorithms feature a set of hyperparameters that must be determined before training commences. The choice of hyper...
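As a concrete illustration of that search problem, here is a minimal random-search sketch: hyperparameters are sampled from hand-chosen ranges and each configuration is scored by cross-validation before training commits to it. The classifier, dataset, and ranges are placeholders, not taken from the paper.

    # Random search over two hyperparameters of a gradient-boosting classifier,
    # scored by cross-validation; the ranges and the dataset are illustrative only.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=500, random_state=0)
    rng = np.random.default_rng(0)

    best_score, best_cfg = -np.inf, None
    for _ in range(20):
        cfg = {
            "learning_rate": 10 ** rng.uniform(-3, 0),   # log-uniform in [1e-3, 1]
            "max_depth": int(rng.integers(1, 6)),
        }
        score = cross_val_score(GradientBoostingClassifier(**cfg, random_state=0),
                                X, y, cv=3).mean()
        if score > best_score:
            best_score, best_cfg = score, cfg

    print(best_cfg, best_score)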
Bayesian Hyperparameter Optimization for Ensemble Learning
In this paper, we bridge the gap between hyperparameter optimization and ensemble learning by performing Bayesian optimization of an ensemble with regards to its hyperparameters. Our method consists in building a fixed-size ensemble, optimizing the configuration of one classifier of the ensemble at each iteration of the hyperparameter optimization algorithm, taking into consideration the intera...
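A rough sketch of that member-at-a-time scheme, with random configuration proposals standing in for the paper's Bayesian optimizer: each round refits one slot of a fixed-size ensemble and keeps the change only if the validation accuracy of the whole ensemble, i.e. the member's interaction with the others, improves. Dataset, base learner, and hyperparameter range are illustrative.

    # Optimize a fixed-size ensemble one member at a time, scoring each proposal
    # jointly with the rest of the ensemble on held-out data.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=600, random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)
    rng = np.random.default_rng(0)

    def fit_member(depth):
        return DecisionTreeClassifier(max_depth=depth, random_state=0).fit(X_tr, y_tr)

    def ensemble_acc(members):
        votes = np.mean([m.predict(X_val) for m in members], axis=0)
        return np.mean((votes >= 0.5) == y_val)

    ensemble = [fit_member(int(rng.integers(1, 10))) for _ in range(3)]  # fixed size 3
    best = ensemble_acc(ensemble)

    for t in range(15):
        slot = t % len(ensemble)                       # optimize one member per round
        candidate = fit_member(int(rng.integers(1, 10)))
        trial = ensemble[:slot] + [candidate] + ensemble[slot + 1:]
        score = ensemble_acc(trial)                    # scored jointly with the others
        if score > best:
            ensemble, best = trial, score

    print("ensemble validation accuracy:", best)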
Hyperparameter Learning in Robust Soft LVQ
We present a technique to extend Robust Soft Learning Vector Quantization (RSLVQ). This algorithm is derived from an explicit cost function and follows the dynamics of a stochastic gradient ascent. The RSLVQ cost function involves a hyperparameter which is kept fixed during training. We propose to adapt the hyperparameter based on the gradient information. Experiments on artificial and real lif...
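The excerpt does not include the paper's actual update rule, so the following is only a generic illustration of the idea of adapting a cost-function hyperparameter from gradient information during training: a toy RSLVQ-like soft-assignment cost over two prototypes, with the softness sigma moved by a finite-difference gradient-ascent step instead of being held fixed. The data, step sizes, and update scheme are all hypothetical.

    # Toy illustration (not the paper's exact rule): adapt the softness sigma of a
    # prototype-based soft-assignment cost by gradient ascent, alongside training.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1, 0.4, (50, 2)), rng.normal(1, 0.4, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    W = np.array([[-0.2, 0.0], [0.2, 0.0]])      # one prototype per class
    sigma = 1.0

    def cost(sigma, W):
        # Sum over the data of log p(correct class | x) under Gaussian assignments.
        d = ((X[:, None, :] - W[None, :, :]) ** 2).sum(-1)   # squared distances
        p = np.exp(-d / (2 * sigma ** 2))
        p /= p.sum(axis=1, keepdims=True)
        return np.log(p[np.arange(len(X)), y] + 1e-12).sum()

    for _ in range(100):
        # Simple prototype update: pull each prototype toward its class mean.
        for k in range(2):
            W[k] += 0.05 * (X[y == k].mean(axis=0) - W[k])
        # Hyperparameter adaptation: gradient ascent on the cost w.r.t. sigma,
        # estimated here by a central finite difference.
        eps = 1e-4
        grad_sigma = (cost(sigma + eps, W) - cost(sigma - eps, W)) / (2 * eps)
        sigma += 1e-4 * grad_sigma

    print("adapted sigma:", round(sigma, 3), "final cost:", round(cost(sigma, W), 3))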
Initializing Bayesian Hyperparameter Optimization via Meta-Learning
Model selection and hyperparameter optimization is crucial in applying machine learning to a novel dataset. Recently, a subcommunity of machine learning has focused on solving this problem with Sequential Model-based Bayesian Optimization (SMBO), demonstrating substantial successes in many applications. However, for computationally expensive algorithms the overhead of hyperparameter optimizatio...
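A small sketch of the warm-start idea, assuming a hypothetical archive of past results: the new dataset is matched to previously-seen datasets by crude meta-features, and the best-known configurations from the nearest match become the first trials of the subsequent Bayesian optimization. The archive contents, meta-features, and configuration format are invented for illustration.

    # Warm-start a hyperparameter search from the configurations that worked best
    # on the most similar previously-seen dataset (similarity via meta-features).
    import numpy as np

    # Hypothetical archive: per past dataset, its meta-features and the
    # configurations (learning_rate, max_depth) that performed best on it.
    archive = {
        "dataset_A": {"meta": [1_000, 20, 0.95], "best_configs": [(0.05, 3), (0.1, 4)]},
        "dataset_B": {"meta": [50_000, 300, 0.40], "best_configs": [(0.01, 8), (0.02, 6)]},
    }

    def meta_features(X, y):
        # n_samples, n_features, and class balance as crude dataset descriptors.
        return np.array([X.shape[0], X.shape[1], np.bincount(y).min() / len(y) * 2])

    def warm_start(X, y, k=2):
        new = meta_features(X, y)
        # Nearest past dataset in (scaled) meta-feature space.
        dists = {name: np.linalg.norm((new - np.array(d["meta"])) /
                                      (np.abs(np.array(d["meta"])) + 1e-9))
                 for name, d in archive.items()}
        nearest = min(dists, key=dists.get)
        return archive[nearest]["best_configs"][:k]   # evaluate these trials first

    rng = np.random.default_rng(0)
    X = rng.normal(size=(800, 25))
    y = (X[:, 0] > 0).astype(int)
    print("initial configurations to evaluate:", warm_start(X, y))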
Journal
Journal title: SoftwareX
Year: 2020
ISSN: 2352-7110
DOI: 10.1016/j.softx.2020.100591